Information Geometric Approach to Bayesian Lower Error Bounds
Authors
Abstract
Information geometry provides a framework in which probability densities can be viewed as structures of differential geometry. This approach has shown that the geometry of the space of probability distributions parameterized by their covariance matrix is linked to fundamental concepts of estimation theory. In particular, prior work proposes a Riemannian metric for the distance between parameterized probability distributions that is equivalent to the Fisher Information Matrix and is helpful in obtaining the deterministic Cramér-Rao lower bound (CRLB). Recent work in this framework has established links with several practical applications. However, the classical CRLB is useful only for unbiased estimators and inaccurately predicts the mean square error in low signal-to-noise ratio (SNR) scenarios. In this paper, we propose a general Riemannian metric that at once yields both the Bayesian CRLB and the deterministic CRLB, along with their vector-parameter extensions. We also extend our results to the Barankin bound, thereby enhancing their applicability to low-SNR situations.
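For orientation, the scalar-parameter forms of the quantities named above are standard and can be sketched as follows (our notation, assuming the usual regularity conditions; the paper's contribution is the general Riemannian metric, not these textbook inequalities):

```latex
% Fisher information of a family p(x;\theta) (the Fisher-Rao metric in 1-D):
\[
  F(\theta) \;=\; \mathbb{E}_{x}\!\left[\left(\frac{\partial \log p(x;\theta)}{\partial \theta}\right)^{\!2}\right].
\]
% Deterministic CRLB for any unbiased estimator \hat{\theta}(x):
\[
  \operatorname{Var}\big(\hat{\theta}\big) \;\ge\; F(\theta)^{-1}.
\]
% Bayesian CRLB (Van Trees inequality) with prior \pi(\theta):
\[
  \mathbb{E}\big[(\hat{\theta}-\theta)^{2}\big] \;\ge\;
  \left(\mathbb{E}_{\theta}\big[F(\theta)\big]
  + \mathbb{E}_{\theta}\!\left[\left(\frac{\partial \log \pi(\theta)}{\partial \theta}\right)^{\!2}\right]\right)^{\!-1}.
\]
```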
Similar references
A New Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications
The existing upper and lower bounds between entropy and error are mostly derived through inequality techniques, without linking them to joint distributions. In fact, from both a theoretical and an application viewpoint, there is a need for a complete set of interpretations of these bounds in relation to joint distributions. For this reason, in this work we propose a new approach of deriving the bo...
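As a concrete illustration of the kind of entropy-error relation these works study, the sketch below numerically checks two classical binary bounds (the Fano lower bound and the Hellman-Raviv upper bound) on a toy joint distribution; it is a toy example under our own assumptions, not the derivation proposed in the cited paper:

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy in bits; clipped to avoid log(0) at the endpoints."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# Toy joint distribution P(x, y) for binary X (rows) and binary Y (columns).
P = np.array([[0.40, 0.10],
              [0.15, 0.35]])

# Bayes error: for each observed y, the MAP rule errs with the smaller column mass.
bayes_error = np.sum(np.min(P, axis=0))

# Equivocation H(X|Y) = H(X,Y) - H(Y), in bits.
H_joint = -np.sum(P * np.log2(P))
P_y = P.sum(axis=0)
H_y = -np.sum(P_y * np.log2(P_y))
H_x_given_y = H_joint - H_y

# Classical relations for the binary case:
#   Fano (binary):   H(X|Y) <= h(Pe), i.e. Pe is lower-bounded via h^{-1}.
#   Hellman-Raviv:   Pe <= H(X|Y) / 2.
print(f"Bayes error Pe       = {bayes_error:.4f}")
print(f"Equivocation H(X|Y)  = {H_x_given_y:.4f} bits")
print(f"Fano: h(Pe) >= H(X|Y)?        {binary_entropy(bayes_error) >= H_x_given_y}")
print(f"Hellman-Raviv: Pe <= H(X|Y)/2? {bayes_error <= H_x_given_y / 2}")
```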
Bayesian Error Based Sequences of Mutual Information Bounds
The inverse relation between mutual information (MI) and Bayesian error is sharpened by deriving finite sequences of upper and lower bounds on MI in terms of the minimum probability of error (MPE) and related Bayesian quantities. The well-known Fano upper bound and Feder-Merhav lower bound on equivocation are tightened by including a succession of posterior probabilities, starting at the largest...
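For reference, the Fano bound mentioned above reads as follows (a standard result, stated here in our notation; the cited paper's tightened sequences are not reproduced):

```latex
% Fano's inequality for X taking M values, estimated from Y with minimum
% probability of error P_e; h(.) is the binary entropy:
\[
  H(X \mid Y) \;\le\; h(P_e) + P_e \log(M-1),
\]
% which immediately yields a lower bound on mutual information:
\[
  I(X;Y) \;=\; H(X) - H(X \mid Y) \;\ge\; H(X) - h(P_e) - P_e \log(M-1).
\]
```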
An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications
In this work, we propose a new approach to deriving the bounds between entropy and error from a joint distribution through optimization. The specific case study is given on binary classifications. Two basic types of classification error are investigated, namely the Bayesian and non-Bayesian errors. The consideration of non-Bayesian errors is due to the fact that most classifiers res...
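For clarity, the two error types mentioned can be written as follows (standard definitions in our notation, not the optimization formulation of the cited work): the Bayesian error is that of the MAP rule, while a non-Bayesian error is that of an arbitrary decision rule:

```latex
% Bayesian (MAP) error versus the error of an arbitrary decision rule \delta:
\[
  P_e^{\mathrm{Bayes}} \;=\; \mathbb{E}_{y}\!\left[1 - \max_{x} P(x \mid y)\right],
  \qquad
  P_e(\delta) \;=\; \Pr\big(\delta(y) \ne x\big) \;\ge\; P_e^{\mathrm{Bayes}}.
\]
```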
Lower Bounds on the Bayes Risk of the Bayesian BTL Model with Applications to Random Graphs
We consider the problem of aggregating pairwise comparisons to obtain a consensus ranking order over a collection of objects. We employ the popular Bradley-Terry-Luce (BTL) model, in which each object is associated with a skill parameter that allows us to probabilistically describe the pairwise comparisons between objects. In particular, we employ the Bayesian BTL model, which allows for meaning...
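For reference, the BTL comparison model itself is as follows (the standard definition, stated here for convenience rather than quoted from the cited paper):

```latex
% Bradley-Terry-Luce: objects i, j have positive skills \lambda_i, \lambda_j;
% with \theta_k = \log \lambda_k this is a logistic model in skill differences:
\[
  \Pr(i \succ j) \;=\; \frac{\lambda_i}{\lambda_i + \lambda_j}
  \;=\; \frac{1}{1 + e^{-(\theta_i - \theta_j)}}.
\]
% A Bayesian BTL model additionally places a prior on the skills.
```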
General classes of performance lower bounds for parameter estimation: part II: Bayesian bounds
In this paper, a new class of Bayesian lower bounds is proposed. Derivation of the proposed class is performed via projection of each entry of the vector-function to be estimated onto a closed Hilbert subspace of L2. This Hilbert subspace contains linear transformations of elements in the domain of an integral transform, applied to functions used for computation of bounds in the Weiss-Weinstein c...
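As background, bounds of this family rest on a covariance (Cauchy-Schwarz) inequality; the following is a generic scalar sketch in our notation, not the specific Hilbert-space projection argument of the cited paper:

```latex
% For any score \psi(x,\theta) satisfying E[\psi(x,\theta) | x] = 0 for all x,
% Cauchy-Schwarz gives, for every estimator \hat{\theta}(x),
\[
  \mathbb{E}\big[(\hat{\theta}(x)-\theta)^{2}\big] \;\ge\;
  \frac{\big(\mathbb{E}\big[(\hat{\theta}(x)-\theta)\,\psi(x,\theta)\big]\big)^{2}}
       {\mathbb{E}\big[\psi(x,\theta)^{2}\big]}
  \;=\;
  \frac{\big(\mathbb{E}\big[\theta\,\psi(x,\theta)\big]\big)^{2}}
       {\mathbb{E}\big[\psi(x,\theta)^{2}\big]},
\]
% where the last equality uses E[\psi | x] = 0, so the bound does not depend
% on \hat{\theta}; particular choices of \psi recover the Bayesian CRLB and
% Weiss-Weinstein-type bounds.
```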
Journal: CoRR
Volume: abs/1801.04658
Issue: -
Pages: -
Publication date: 2018